1 Backpropagation: The Basic Theory
Authors
Abstract
Since the publication of the PDP volumes in 1986, learning by backpropagation has become the most popular method of training neural networks. The reason for this popularity is the underlying simplicity and relative power of the algorithm. Its power derives from the fact that, unlike its precursors, the perceptron learning rule and the Widrow-Hoff learning rule, it can be employed for training nonlinear networks of arbitrary connectivity. Since such networks are often required for real-world applications, such a learning procedure is critical. Nearly as important as its power in explaining its popularity is its simplicity. The basic idea is old and simple; namely, define an error function and use hill climbing (or gradient descent if you prefer going downhill) to find a set of weights which optimize performance on a particular task. The algorithm is so simple that it can be implemented in a few lines of code, and there have no doubt been many thousands of implementations of the algorithm by now. The name back propagation actually comes from the term employed by Rosenblatt (1962) for his attempt to generalize the perceptron learning algorithm to the multilayer case. There were many attempts to generalize the perceptron learning procedure to multiple layers during the 1960s and 1970s, but none of them were especially successful. There appear to have been at least three independent inventions of the modern version of the back-propagation algorithm: Paul Werbos developed the basic idea in 1974 in a Ph.D. dissertation entitled
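The abstract's claim that the algorithm fits in a few lines of code can be illustrated with a minimal sketch: a toy 2-2-1 network trained on XOR by gradient descent on a squared-error function. The network size, learning rate, and epoch count below are illustrative choices, not taken from the chapter.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# Toy 2-2-1 network: two hidden units, one output (all constants illustrative).
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
b1 = [0.0, 0.0]                                                     # hidden biases
W2 = [random.uniform(-1, 1) for _ in range(2)]                      # output weights
b2 = 0.0                                                            # output bias
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]         # XOR task
lr = 0.5

def forward(x):
    h = [sigmoid(W1[j][0]*x[0] + W1[j][1]*x[1] + b1[j]) for j in range(2)]
    o = sigmoid(W2[0]*h[0] + W2[1]*h[1] + b2)
    return h, o

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

e_before = total_error()
for _ in range(5000):
    for x, t in data:
        h, o = forward(x)
        # Backward pass: propagate the error derivative layer by layer.
        delta_o = (o - t) * o * (1 - o)                       # dE/d(net_out)
        delta_h = [delta_o * W2[j] * h[j] * (1 - h[j])        # dE/d(net_hidden_j)
                   for j in range(2)]
        # Gradient-descent updates.
        b2 -= lr * delta_o
        for j in range(2):
            W2[j] -= lr * delta_o * h[j]
            b1[j] -= lr * delta_h[j]
            for i in range(2):
                W1[j][i] -= lr * delta_h[j] * x[i]
e_after = total_error()  # the error on the task decreases after training
```

The two nested update loops are the whole of "backpropagation": compute each unit's error derivative from the layer above it, then step every weight down the gradient.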
Similar resources
Backpropagation and his application in ECG classification
We show, on an example from medical diagnosis, that some problems can be solved using simple neural networks. First we define some basic notions from neural network theory. We also mention some basic facts about electrocardiography. Then we use a three-layered neural network with the backpropagation algorithm to classify patients' ECG signals into two classes, and summarize the results.
30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation
Fundamental developments in feedforward artificial neural networks from the past thirty years are reviewed. The central theme of this paper is a description of the history, origination, operating characteristics, and basic theory of several supervised neural network training algorithms including the Perceptron rule, the LMS algorithm, three Madaline rules, and the backpropagation technique. The...
H∞ Optimality Criteria for LMS and Backpropagation
We have recently shown that the widely known LMS algorithm is an H∞ optimal estimator. The H∞ criterion was introduced, initially in the control theory literature, as a means to ensure robust performance in the face of model uncertainties and lack of statistical information on the exogenous signals. We extend here our analysis to the nonlinear setting often encountered in neural networks...
Computing Iterative Roots with Neural Networks
Many real processes are composed of an n-fold repetition of some simpler process. If the whole process can be modelled with a neural network, we present a method to derive a model of the basic process as well, thus performing not only a system identification but also a decomposition into basic blocks. Mathematically this is equivalent to the problem of computing iterative or functional roots: Given ...
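The notion of an iterative (functional) root in this abstract can be made concrete with a trivial scalar example, chosen purely for illustration (the paper's setting uses learned neural models): f is an iterative square root of g when composing f with itself reproduces g.

```python
# Illustrative example of an iterative (functional) root:
# f is a "square root" of g under composition if f(f(x)) == g(x) for all x.
def g(x):
    return 4 * x   # the whole process

def f(x):
    return 2 * x   # its iterative root: f composed with f equals g

ok = all(f(f(x)) == g(x) for x in range(-5, 6))  # holds on these sample points
```

In the paper's setting, g is a trained network for the whole process and f must be recovered numerically rather than read off in closed form.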
Investigation of Mechanical Properties of Self Compacting Polymeric Concrete with Backpropagation Network
Acrylic polymer is highly stable against chemicals and is a good choice when concrete is subject to chemical attack. In this study, self-compacting concrete (SCC) made using acrylic polymer, nanosilica and microsilica has been investigated. The results of experimental testing showed that the addition of microsilica and acrylic polymer decreased the tensile, compressive and bending strength...
Publication date: 2008